Fine-tuning my misinformation detector

Date:

I was at my best friend’s house, surrounded by her relatives. We were all talking about a myriad of topics when she suddenly made a weird comment: “Did you know that plants die when given water heated in a microwave?” My reaction was laughter and ridicule, given the overall implausibility of the claim. She was committed to defending it, and we started trading arguments. She didn’t remember the source (looking at you, WhatsApp), was uncertain about any mechanism that could explain such a phenomenon, and certainly hadn’t tried it herself. She didn’t concede, though; her counterclaims ranged from “maybe there is some unknown mechanism” to “if you’re so skeptical, go try it yourself.” Soon her family joined in too, collectively labeling me a skeptic who needs research papers to trust anything. This got me thinking: was I actually wrong? So I went online, searched for this supposed phenomenon, and found nada to support it. A similar thing happened while I was watching a YouTube video of a supposed spiritual leader who cited research done by his own foundation to support his claims that planetary motion affects our daily lives.

In both cases, my misinformation detector went off because I knew the claims couldn’t be true based on my knowledge of physics and biology. Water heated in a microwave doesn’t magically change its chemical composition and become poisonous to plants, and planets don’t affect your life. The real question is what other “facts” I’ve swallowed because my misinformation detector lay dormant. The pattern behind my detections was my own biases: I distrust astrology and tend to trust modern technology, so the pattern behind my misses has to be the same. Claims that seemed neatly aligned with my worldview and my political and social beliefs probably slipped through. I guess a similar thing happened with my friend, whose mistrust of modern contraptions and insistence on old ways might have made her ignore her genuinely substantial knowledge of botany and physics.

I must have my blind spots too. Relying on evidence-based veracity checks is not always possible, and even when it is, it leaves me exposed to shoddy research, corporately funded research biases, and people who merely sound like experts. Most importantly, relying on evidence doesn’t work on opinions. Opinions like “this cake is good” could be subjectively true, and claims like “religion is good for society” are hard to assess in a few seconds. So I thought of coming up with another filter that takes these things into account. “Cigarette filters make smoking safer” is a good example of such misinformation: it sounds legit, it’s repeated by smart people who smoke, and it’s an utterly made-up marketing ploy. Filters slightly reduce tar deposition in your lungs but don’t make much difference overall. Or take the advice gym trainers give to exercise a particular body part to reduce the fat around it: utter misinformation.

So the first lesson for me is: don’t trust people who sound smart or expert without some basic inquiry. How does it work? Why does it happen this way? If they can’t demonstrate a deeper understanding of the subject, they might be knowingly or unknowingly misinforming me. This still leaves the first example untouched: a friend who knows far more about botany than I do still couldn’t be trusted on a botany claim. So the second filter has to be checking the biases and possible motives of the messenger bestowing the knowledge onto me. This is certainly harder and will need more practice. It’s still not a perfect art, but it does make a significant improvement to my current strategy.